Algorithms and Complexity for Continuous Problems
Abstract
Abstracts of Talks (in alphabetical order by speaker's surname)

Probabilistic Analysis of Interior Point Methods for Linear Programming
Karl Heinz Borgwardt (joint work with Petra Huhn)

Our aim is to make possible a fair comparison of the Simplex Method and Interior Point Methods on the basis of their average-case behavior in the solution of Linear Programming problems. We therefore use the same stochastic model (the Rotation Symmetry Model) for the average-case analysis of both solution methods. Interior Point Methods run in three phases. In Phase I the aim is to get close to the analytic center of the feasible region. In Phase IIa we perform an iteration process which reduces the distance to the optimum at least with linear convergence. In Phase IIb we search for a nearby vertex, starting from the last iteration point of Phase IIa. We demonstrate that for both worst-case and average-case behavior the following geometric quantities determine the number of iterations:

1. The maximal distance of a vertex to the origin (the maximal vertex norm) is the crucial measure for the effort in Phase I.
2. The difference of the objective values at the best and the second-best vertex determines the number of iterations required in Phase IIa.

In worst-case polynomiality proofs these two quantities can only be bounded in terms of the encoding length L of the problem. This is why worst-case bounds contain this extremely large factor, and why polynomiality, but not strong polynomiality, can be shown. In the average-case analysis we calculate the distribution functions of the two geometric quantities mentioned above; via the evaluation of integral formulas, this yields bounds on the expected number of iterations for all phases. The result of this stochastic analysis is a proof that the expected number of iterations for the whole method is not only polynomial in the encoding length but also strongly polynomial in the dimensions of the problem.

Impossibility of Exponential Convergence Rate for Optimization on the Wiener Space
Jim Calvin

It is possible to approximate the minimum of a unimodal function on an interval with the worst-case error converging to zero at an exponential rate (for example, using the Fibonacci search algorithm). Without strong assumptions such as unimodality, no such worst-case bound is available. In a stochastic setting we can obtain probabilistic bounds for large classes of functions. Let P be the Wiener measure, and choose a sequence of numbers c_n → 0 arbitrarily slowly. Then there exists an algorithm such that P(∆_n ≤ exp(−c_n n)) → 1, where ∆_n is the approximation error after n observations. The purpose of this talk is to show that P(∆_n ≤ exp(−c n)) → 0 for any algorithm and any c > 0.
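The Fibonacci search named above is the classical way to obtain an exponential worst-case rate for unimodal functions, and a short sketch makes the mechanism concrete. The following Python snippet is illustrative only and is not part of the seminar abstracts; the function name fibonacci_search, its argument names, and the test objective are choices made here, and the sketch omits the small perturbation that practical implementations use at the final, degenerate step.

```python
def fibonacci_search(f, a, b, n):
    """Minimize a unimodal function f on [a, b] using n evaluations of f.

    The bracketing interval shrinks by consecutive Fibonacci ratios, so its
    length after n evaluations is proportional to 1 / F_n, i.e. the worst-case
    error decays at a geometric (exponential) rate.
    """
    # Fibonacci numbers F_0, ..., F_n with F_0 = F_1 = 1.
    fib = [1, 1]
    while len(fib) < n + 1:
        fib.append(fib[-1] + fib[-2])

    # Two interior probe points, placed symmetrically by Fibonacci ratios.
    x1 = a + fib[n - 2] / fib[n] * (b - a)
    x2 = a + fib[n - 1] / fib[n] * (b - a)
    f1, f2 = f(x1), f(x2)

    for k in range(1, n - 1):
        if f1 <= f2:
            # Minimum lies in [a, x2]: reuse x1 as the new upper probe.
            b, x2, f2 = x2, x1, f1
            x1 = a + fib[n - k - 2] / fib[n - k] * (b - a)
            f1 = f(x1)
        else:
            # Minimum lies in [x1, b]: reuse x2 as the new lower probe.
            a, x1, f1 = x1, x2, f2
            x2 = a + fib[n - k - 1] / fib[n - k] * (b - a)
            f2 = f(x2)
    # The last step is degenerate (both probes coincide at the midpoint);
    # practical implementations perturb it by a small epsilon.
    return (a + b) / 2


# Example: minimize a unimodal test function on [0, 1] with 30 evaluations.
print(fibonacci_search(lambda x: (x - 0.3) ** 2, 0.0, 1.0, 30))  # ~0.3
```

Because the bracket shrinks by Fibonacci ratios, n evaluations reduce its length to roughly (b − a)/F_n; this geometric decay is exactly the exponential rate that, by the result above, cannot be achieved with probability tending to one on the Wiener space once unimodality is dropped.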
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
Publication date: 2001